Deep Learning With Python: Advanced and Effective Strategies of Using Deep Learning with Python Theories by Williams Ethan



The different folds produce noticeably different validation scores, in our case ranging from 2.6 to 3.2. The whole point of K-fold validation is to average these scores into a single, more reliable number, which is 3.0 in our case. Since the targets are house prices in thousands of dollars, this means our predictions are still off by about $3,000 on average, which is very significant.
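For illustration, here is a minimal sketch of how the mean of the per-fold scores can be computed with NumPy; the list all_scores and its values are hypothetical stand-ins for the per-fold validation MAE values collected during K-fold validation:

import numpy as np

all_scores = [2.6, 3.1, 2.9, 3.2]   # hypothetical per-fold validation MAE (in thousands of dollars)
print(np.mean(all_scores))          # the single, more reliable score we report (about 2.95 here)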

We will now try training the network for longer, this time for 500 epochs. In addition, we will modify the training loop so that the model's performance on each epoch is recorded in a per-fold validation score log.

The code to save the validation logs at each fold is shown below:

import numpy as np

# k, num_val_samples, train_data, train_targets and build_model
# come from the K-fold validation code shown earlier.
num_epochs = 500
all_mae_histories = []

for i in range(k):
    print('processing fold #', i)

    # Data from partition #i is used for validation
    val_data = train_data[i * num_val_samples: (i + 1) * num_val_samples]
    val_targets = train_targets[i * num_val_samples: (i + 1) * num_val_samples]

    # Data from all the other partitions is used for training
    partial_train_data = np.concatenate(
        [train_data[:i * num_val_samples],
         train_data[(i + 1) * num_val_samples:]],
        axis=0)
    partial_train_targets = np.concatenate(
        [train_targets[:i * num_val_samples],
         train_targets[(i + 1) * num_val_samples:]],
        axis=0)

    # Build a fresh, compiled model for this fold and train it silently,
    # keeping the per-epoch validation MAE from the History object
    model = build_model()
    history = model.fit(partial_train_data, partial_train_targets,
                        validation_data=(val_data, val_targets),
                        epochs=num_epochs, batch_size=1, verbose=0)
    # Note: with newer tf.keras versions this key may be 'val_mae',
    # depending on the metric name used when compiling the model
    mae_history = history.history['val_mean_absolute_error']
    all_mae_histories.append(mae_history)
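With the per-fold logs stored in all_mae_histories, a natural follow-up is to average the validation MAE of each epoch across the folds. The snippet below is a minimal sketch under the assumption that every fold ran for the full num_epochs epochs; the name average_mae_history is ours:

# Mean validation MAE per epoch, averaged over the k folds
average_mae_history = [
    np.mean([fold[i] for fold in all_mae_histories]) for i in range(num_epochs)]

Plotting this per-epoch average is a common way to spot the epoch at which the model starts to overfit.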


